Boosting Nearest Neighbor Classifiers for Multiclass Recognition
Abstract
This paper introduces an algorithm that uses boosting to learn a distance measure for multiclass k-nearest neighbor classification. Given a family of distance measures as input, AdaBoost is used to learn a weighted distance measure that is a linear combination of the input measures. The proposed method can be seen both as a novel way to learn a distance measure from data and as a novel way to apply boosting to multiclass recognition problems that does not require output codes. In our approach, multiclass recognition of objects is reduced to a single binary recognition task, defined on triples of objects. Preliminary experiments with eight UCI datasets yield no clear winner among our method, boosting using output codes, and k-nearest neighbor classification using an unoptimized distance measure. Our algorithm did achieve lower error rates on some of the datasets, which indicates that, in some domains, it may lead to better results than existing methods.
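To make the construction concrete, here is a minimal Python/numpy sketch of one plausible reading of the above: each candidate distance measure induces a weak classifier on a triple (q, a, b), where a shares q's class and b does not, predicting whether q is ranked closer to a than to b, and discrete AdaBoost accumulates per-measure weights that are then reused as the linear combination driving k-nearest neighbor classification. This is an illustration under those assumptions, not the paper's implementation; the function names adaboost_distance_weights and combined_distance are hypothetical.

import numpy as np

def adaboost_distance_weights(triples, distance_fns, n_rounds=20):
    """Learn weights for a family of distance measures with discrete AdaBoost.

    triples      : list of (q, a, b), where a shares q's class and b does not
    distance_fns : list of callables d(x, y) -> float
    """
    n = len(triples)
    w = np.full(n, 1.0 / n)                # AdaBoost weights over the triples
    alphas = np.zeros(len(distance_fns))   # accumulated weight per distance measure

    # Weak-classifier outputs: +1 if distance j ranks the same-class object a
    # closer to q than the different-class object b, -1 otherwise.
    H = np.array([[1.0 if d(q, b) - d(q, a) > 0 else -1.0
                   for d in distance_fns]
                  for (q, a, b) in triples])

    for _ in range(n_rounds):
        # Pick the distance measure whose weak classifier has the lowest
        # weighted error on the current triple weights.
        errors = np.array([w[H[:, j] < 0].sum() for j in range(len(distance_fns))])
        j = int(np.argmin(errors))
        err = float(np.clip(errors[j], 1e-10, 1 - 1e-10))
        alpha = 0.5 * np.log((1 - err) / err)
        alphas[j] += alpha

        # Reweight: triples that distance j misranks gain weight.
        w *= np.exp(-alpha * H[:, j])
        w /= w.sum()

    return alphas

def combined_distance(x, y, distance_fns, alphas):
    """The learned weighted distance, used afterwards by a k-NN classifier."""
    return sum(a * d(x, y) for a, d in zip(alphas, distance_fns))

Since every triple carries the implicit positive label "q should be closer to its own class", a weak classifier's weighted error is simply the weight mass of the triples it misranks; after boosting, the learned alphas define the single weighted distance used by an ordinary k-NN rule.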
Similar resources
Stopping Criterion for Boosting-Based Data Reduction Techniques: from Binary to Multiclass Problems
So far, boosting has been used to improve the quality of moderately accurate learning algorithms, by weighting and combining many of their weak hypotheses into a final classifier with theoretically high accuracy. In a recent work (Sebban, Nock and Lallich, 2001), we have attempted to adapt boosting properties to data reduction techniques. In this particular context, the objective was not only t...
Multiclass Boosting with Adaptive Group-Based kNN and Its Application in Text Categorization
AdaBoost is an excellent committee-based tool for classification. However, its effectiveness and efficiency in multiclass categorization face challenges from methods based on support vector machines (SVM), neural networks (NN), naïve Bayes, and k-nearest neighbors (kNN). This paper uses a novel multiclass AdaBoost algorithm to avoid reducing the multiclass classification problem to multiple tw...
A New AdaBoost Algorithm for Large Scale Classification And Its Application to Chinese Handwritten Character Recognition
Existing multiclass boosting algorithms are hard to apply to Chinese handwritten character recognition because of the large number of classes. Most of them are based on schemes that convert multiclass classification into multiple binary classifications and have high training complexity. The proposed multiclass boosting algorithm adopts descriptive-model-based multiclass classifiers (Modified Qu...
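The "multiple binary classifications" scheme that this last abstract contrasts itself with is, in its simplest form, a one-versus-rest reduction: one binary (possibly boosted) classifier per class. The Python sketch below is a generic illustration of that reduction, not of any specific algorithm above; train_binary_classifier stands in for an arbitrary binary learner and is hypothetical.

import numpy as np

def train_one_vs_rest(X, y, classes, train_binary_classifier):
    """One binary problem per class: class c versus everything else."""
    models = {}
    for c in classes:
        labels = np.where(y == c, 1, -1)          # relabel for the binary task
        models[c] = train_binary_classifier(X, labels)
    return models

def predict_one_vs_rest(X, models):
    """Predict the class whose binary scorer is most confident."""
    classes = list(models)
    scores = np.column_stack([models[c](X) for c in classes])
    return np.array(classes)[np.argmax(scores, axis=1)]

Training cost grows with the number of classes, which is the concern the abstract raises for large character sets.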
Journal title:
Volume / Issue:
Pages: -
Publication date: 2004